Democrats Want To Hold Social Media Companies Responsible For Health Misinformation
Democratic senators introduced a bill on Thursday that would hold Facebook, YouTube and other social media companies responsible for the proliferation of falsehoods about vaccines, fake cures and other harmful health-related claims on their sites.
Co-sponsored by Democratic Senators Amy Klobuchar of Minnesota and Ben Ray Luján of New Mexico, the Health Misinformation Act targets a provision in Section 230 of the Communications Decency Act, which protects platforms from being held liable for what their users post in most cases.
The bill would strip the companies of that legal shield if their algorithms promote health misinformation during a public health crisis. It would not apply if such misinformation is shown in a chronological feed. (Most social platforms use algorithms to rank posts based on what they think users will be interested in.)
The legislation leaves it up to the U.S. Department of Health and Human Services, which is responsible for declaring public health emergencies, to define what constitutes health misinformation.
"These are some of the biggest, richest companies in the world and they must do more to prevent the spread of deadly vaccine misinformation," Klobuchar said in a statement. "The coronavirus pandemic has shown us how lethal misinformation can be and it is our responsibility to take action."
She cited a recent poll from the nonprofit Kaiser Family Foundation that found two-thirds of unvaccinated people believe myths about COVID-19 vaccines, such as the baseless notion that vaccines cause the disease.
Tensions over social media's role in the spread of fraudulent claims about COVID-19 vaccines have come to a head as stalling vaccination rates and the rise of the Delta variant threaten to prolong the pandemic.
U.S. Surgeon General Vivek Murthy warned last week that COVID-19 misinformation is an "urgent threat." The White House has called out Facebook in particular, saying it needs to do more to curb false anti-vaccination posts.
On Friday, President Biden said the social platforms are "killing people," although he later walked back that comment and said he meant that people who spread misinformation about vaccines online are irresponsible.
Facebook pushed back, accusing the administration of "finger pointing." The company said it has taken down more than 18 million pieces of COVID misinformation, shown authoritative information about COVID and vaccines to more than 2 billion people, and that its own surveys find "vaccine acceptance among Facebook users in the U.S. has increased."
CEO Mark Zuckerberg told the website The Verge on Thursday that he is "quite confident" that the company has been "a positive force here."
This week, YouTube said it would start putting notices on some videos about health with links to "authoritative" sources, and highlight videos from those sources in search results on certain health topics.
But critics say social media companies need to go further. The Center for Countering Digital Hate, a nonprofit that combats disinformation, says just 12 people are responsible for 65% of anti-vaccine posts shared on social media, and has criticized Facebook, YouTube and Twitter for not booting them off their platforms entirely. (The platforms have removed some accounts belonging to a few of the dozen, but none has kicked all of them off.)
Klobuchar said the 25-year-old Section 230 is allowing misinformation to thrive online.
"The law — which was intended to promote online speech and allow online services to grow — now distorts legal incentives for platforms to respond to digital misinformation on critical health issues, like COVID-19, and leaves people who suffer harm with little to no recourse," Klobuchar's office said in a press release about her proposed legislation.
The White House has also said it's "reviewing" whether to seek changes to Section 230 to tackle COVID misinformation.
The legal shield has come under fire from lawmakers on both sides of the aisle who say it's become outdated now that tech platforms play such a dominant role in society.
Democrats say Section 230 allows social media companies to duck responsibility for harmful content like misinformation and hate speech, while Republicans claim it's given platforms cover to censor conservatives. (There is little public evidence showing platforms treat conservatives more harshly than others.)
However, trying to hold the platforms responsible for health misinformation may come up against challenges on First Amendment grounds, because such content likely falls into the category of "lawful but awful" speech, said Eric Goldman, a law professor at Santa Clara University.
"If health misinformation is constitutionally protected, then there's really not a lot Congress can do about that," he said. "Removing Section 230, which is a liability shield, doesn't expose a [social media] service to any new liability, because the Constitution will fill in the protection."
Defining which health claims are legitimate and which aren't also raises thorny issues, said Renée DiResta, who studies misinformation at the Stanford Internet Observatory.
"There are times when the consensus just isn't fully formed yet," she said, pointing to the early days of the pandemic when there were debates about whether the virus was airborne.
"Asking or expecting the platforms to take action on certain types of health misinformation may be reasonable, but this sort of dynamic that we've all watched unfold over the last year and a half makes clear how this approach has some potentially problematic pitfalls," she said.
DiResta also warned that focusing narrowly on how platforms handle specific types of content, whether it's lies about vaccines or baseless claims of election fraud, risks ignoring the bigger picture of how information is created and spread — both on social media and in other channels.
"There's hope that we can cure all of the problems of the world by amending [Section 230]," she said. "It's not so simple as we're going to regulate social media platforms and this is all going to go away."
Editor's note: Facebook and Google, which owns YouTube, are among NPR's financial supporters.